
    The aerospace energy systems laboratory: Hardware and software implementation

    For many years, NASA Ames Research Center's Dryden Flight Research Facility has employed automation in the servicing of flight-critical aircraft batteries. Recently a major upgrade to Dryden's computerized Battery Systems Laboratory was initiated to incorporate distributed processing and a centralized database. The new facility, called the Aerospace Energy Systems Laboratory (AESL), is being mechanized with iAPX86 and iAPX286 hardware running iRMX86. The hardware configuration and software structure for the AESL are described.

    An automated calibration laboratory for flight research instrumentation: Requirements and a proposed design approach

    NASA's Dryden Flight Research Facility (Ames-Dryden) operates a diverse fleet of research aircraft which are heavily instrumented to provide both real time data for in-flight monitoring and recorded data for postflight analysis. Ames-Dryden's existing automated calibration (AUTOCAL) laboratory is a computerized facility which tests aircraft sensors to certify accuracy for anticipated harsh flight environments. Recently, a major AUTOCAL lab upgrade was initiated; the goal of this modernization is to enhance productivity and improve configuration management for both software and test data. The new system will have multiple testing stations employing distributed processing linked by a local area network to a centralized database. The baseline requirements for the new AUTOCAL lab and the design approach being taken for its mechanization are described.

    The Aerospace Energy Systems Laboratory: A BITBUS networking application

    The NASA Ames-Dryden Flight Research Facility developed a computerized aircraft battery servicing facility called the Aerospace Energy Systems Laboratory (AESL). This system employs distributed processing with communications provided by a 2.4-megabit BITBUS local area network. Customized handlers provide real time status, remote command, and file transfer protocols between a central system running the iRMX-II operating system and ten slave stations running the iRMX-I operating system. The hardware configuration and software components required to implement this BITBUS application are described.
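    The master/slave polling pattern described above can be sketched in miniature. This is a hypothetical illustration, not the actual iRMX handler protocol: the frame layout, command codes, and voltage payload are all assumptions made for the example.

```python
# Hypothetical sketch of a central master polling slave stations for battery
# status, loosely modeled on the AESL layout (one central system, ten slaves).
# The frame format and command codes are invented for illustration; they are
# not the real BITBUS/iRMX message definitions.

import struct

STATUS_REQUEST = 0x01
STATUS_REPLY = 0x02

def build_frame(node, command, payload=b""):
    """Pack a frame: node address, command byte, payload length, payload."""
    return struct.pack("BBB", node, command, len(payload)) + payload

def parse_frame(frame):
    """Unpack a frame back into (node, command, payload)."""
    node, command, length = struct.unpack("BBB", frame[:3])
    return node, command, frame[3:3 + length]

class SlaveStation:
    """Stand-in for one slave station: answers status polls for its node."""
    def __init__(self, node, voltage):
        self.node = node
        self.voltage = voltage

    def handle(self, frame):
        node, command, _ = parse_frame(frame)
        if node == self.node and command == STATUS_REQUEST:
            # Report voltage in hundredths of a volt as a 16-bit field
            payload = struct.pack(">H", int(self.voltage * 100))
            return build_frame(self.node, STATUS_REPLY, payload)
        return None

def poll_all(slaves):
    """Central system polls every station and collects battery voltages."""
    readings = {}
    for slave in slaves:
        reply = slave.handle(build_frame(slave.node, STATUS_REQUEST))
        if reply is not None:
            node, _, payload = parse_frame(reply)
            readings[node] = struct.unpack(">H", payload)[0] / 100
    return readings

slaves = [SlaveStation(node, 24.0 + node * 0.1) for node in range(1, 11)]
readings = poll_all(slaves)
```

    In the real AESL the transport is the BITBUS network and the handlers also support remote command and file transfer; this sketch covers only the status-poll round trip.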

    Rethinking our understanding of the pathogenesis of necrotic enteritis in chickens

    For decades, low doses of antibiotics have been used widely in animal production to promote growth. However, there is a trend to reduce this use of antibiotics in feedstuffs, and legislation is now in place in Europe to prohibit their use in this way. As a consequence, economically important diseases, such as necrotic enteritis (NE) of chickens, that are caused by Clostridium perfringens have become more prevalent. Recent research is creating a paradigm shift in our understanding of the pathogenesis of NE and is now providing information that will be necessary to monitor and control the incidence of NE in poultry.

    Analysis of stratospheric ozone, temperature, and minor constituent data

    The objective of this research is to use available satellite measurements of temperature and constituent concentrations to test the conceptual picture of stratospheric chemistry and transport. This was originally broken down into two sub-goals: first, to use the constituent data to search for critical tests of our understanding of stratospheric chemistry and second, to examine constituent transport processes emphasizing interactions with chemistry on various time scales. A third important goal which has evolved is to use the available solar backscattered ultraviolet (SBUV) and Total Ozone Mapping Spectrometer (TOMS) data from Nimbus 7 to describe the morphology of recent changes in Antarctic and global ozone with emphasis on searching for constraints to theories. The major effort now being pursued relative to the two original goals is our effort as a theoretical team for the Arctic Airborne Stratospheric Expedition (AASE). Our effort for the AASE is based on the 3D transport and chemistry model at Goddard. Our goal is to use this model to place the results from the mission data in a regional and global context. Specifically, we set out to make model runs starting in late December and running through March of 1989, both with and without heterogeneous chemistry. The transport is to be carried out using dynamical fields from a 4D data assimilation model being developed under separate funding from this task. We have successfully carried out a series of single constituent transport experiments. One of the things demonstrated by these runs was the difficulty in obtaining observed low N2O abundances in the vortex without simultaneously obtaining very high ozone values. Because the runs start in late December, this difficulty arises in the attempt to define consistent initial conditions for the 3D model. 
To obtain a consistent set of initial conditions, we are using the 2D photochemistry-transport model of Jackman and Douglass and mapping in potential temperature, potential vorticity space as developed by Schoeberl and coworkers.
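    The mapping step described above can be sketched numerically: a 2D constituent field given on (potential temperature, potential vorticity) coordinates is projected onto a 3D grid whose theta and PV fields are known. The grids and the synthetic N2O-like field below are assumptions for illustration; neither the Jackman-Douglass 2D model nor the Goddard 3D model is used here.

```python
# Sketch of mapping a 2D zonal-mean constituent field f(theta, PV) onto a 3D
# grid, the approach used to build self-consistent 3D initial conditions.
# All fields here are synthetic stand-ins.

import numpy as np

def map_2d_to_3d(theta_axis, pv_axis, field_2d, theta_3d, pv_3d):
    """Bilinear lookup of field_2d(theta, pv) at each 3D grid point."""
    # Locate the lower-left cell corner for every 3D point
    ti = np.clip(np.searchsorted(theta_axis, theta_3d) - 1, 0, len(theta_axis) - 2)
    pi = np.clip(np.searchsorted(pv_axis, pv_3d) - 1, 0, len(pv_axis) - 2)
    # Fractional position within the cell (clipped at the grid edges)
    tw = np.clip((theta_3d - theta_axis[ti]) / (theta_axis[ti + 1] - theta_axis[ti]), 0, 1)
    pw = np.clip((pv_3d - pv_axis[pi]) / (pv_axis[pi + 1] - pv_axis[pi]), 0, 1)
    return ((1 - tw) * (1 - pw) * field_2d[ti, pi]
            + tw * (1 - pw) * field_2d[ti + 1, pi]
            + (1 - tw) * pw * field_2d[ti, pi + 1]
            + tw * pw * field_2d[ti + 1, pi + 1])

# Synthetic 2D field: a tracer decreasing with theta and with PV
theta_axis = np.linspace(350.0, 850.0, 11)   # potential temperature, K
pv_axis = np.linspace(0.0, 10.0, 21)         # potential vorticity, PV units
field_2d = 300.0 - 0.2 * (theta_axis[:, None] - 350.0) - 5.0 * pv_axis[None, :]

# Uniform 3D theta and PV fields for a tiny test grid
theta_3d = np.full((4, 4, 4), 600.0)
pv_3d = np.full((4, 4, 4), 5.0)
tracer_3d = map_2d_to_3d(theta_axis, pv_axis, field_2d, theta_3d, pv_3d)
```

    Because PV and theta are approximately conserved on the relevant time scales, initializing in these coordinates keeps the tracer field dynamically consistent with the meteorological analysis.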

    A decision tree algorithm for investigation of model biases related to dynamical cores and physical parameterizations

    An object-based evaluation method using a pattern recognition algorithm (i.e., classification trees) is applied to the simulated orographic precipitation for idealized experimental setups using the National Center for Atmospheric Research (NCAR) Community Atmosphere Model (CAM) with the finite volume (FV) and the Eulerian spectral transform dynamical cores with varying resolutions. Daily simulations were analyzed and three different types of precipitation features were identified by the classification tree algorithm. The statistical characteristics of these features (i.e., maximum value, mean value, and variance) were calculated to quantify the difference between the dynamical cores and changing resolutions. Even with the simple and smooth topography in the idealized setups, complexity in the precipitation fields simulated by the models develops quickly. The classification tree algorithm using objective thresholding successfully detected different types of precipitation features even as the complexity of the precipitation field increased. The results show that the complexity and the bias introduced in small-scale phenomena due to the spectral transform method of the CAM Eulerian spectral dynamical core is prominent, and is an important reason for its dissimilarity from the FV dynamical core. The resolvable scales, in both the horizontal and vertical dimensions, have a significant effect on the simulation of precipitation.
    The results of this study also suggest that an efficient and informative study of the biases produced by GCMs should involve daily (or even hourly) output, rather than monthly means, analyzed over local scales.

    Key Points:
    - The complexity and the bias introduced in small-scale phenomena due to the spectral transform method of the CAM Eulerian spectral dynamical core are prominent
    - The classification tree algorithm with objective thresholding is successful in detecting different types of precipitation features with high spatial complexity
    - An efficient and informative study of the biases produced by GCMs should involve daily (or hourly) output (rather than monthly mean) analysis over local scales
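    The "objective thresholding" at the heart of a classification tree can be illustrated with a minimal one-level split. This is a toy sketch, not the study's algorithm: the precipitation-object statistics and labels below are synthetic, and only the Gini-based threshold search of a single tree node is shown.

```python
# Minimal sketch of objective thresholding: find the split threshold on one
# feature (here, an object's maximum precipitation) that minimizes Gini
# impurity, as a single node of a classification tree would. Data are
# synthetic stand-ins for two types of precipitation objects.

import numpy as np

def best_split(values, labels):
    """Return (threshold, gini) for the impurity-minimizing binary split."""
    order = np.argsort(values)
    values, labels = values[order], labels[order]
    classes = np.unique(labels)
    best_threshold, best_gini = None, 1.0
    for i in range(1, len(values)):
        left, right = labels[:i], labels[i:]
        gini = 0.0
        for side in (left, right):
            p = np.array([(side == c).mean() for c in classes])
            gini += len(side) / len(labels) * (1.0 - (p ** 2).sum())
        if gini < best_gini:
            # Place the threshold midway between the neighboring values
            best_threshold, best_gini = (values[i - 1] + values[i]) / 2, gini
    return best_threshold, best_gini

# Maximum precipitation of eight synthetic objects: four weak, four intense
maxima = np.array([1.0, 1.5, 2.0, 2.5, 9.0, 9.5, 10.0, 11.0])
labels = np.array([0, 0, 0, 0, 1, 1, 1, 1])
threshold, gini = best_split(maxima, labels)
```

    A full classification tree repeats this search recursively over all features (maximum, mean, variance) and subsets of the data; here the two object types are cleanly separable, so one split suffices.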

    Software Testing and Verification in Climate Model Development

    Over the past 30 years most climate models have grown from relatively simple representations of a few atmospheric processes to a complex multi-disciplinary system. Computer infrastructure over that period has gone from punch-card mainframes to modern parallel clusters. Model implementations have become complex, brittle, and increasingly difficult to extend and maintain. Existing verification processes for model implementations rely almost exclusively upon some combination of detailed analysis of output from full climate simulations and system-level regression tests. In addition to being quite costly in terms of developer time and computing resources, these testing methodologies are limited in terms of the types of defects that can be detected, isolated, and diagnosed. Mitigating these weaknesses of coarse-grained testing with finer-grained "unit" tests has been perceived as cumbersome and counter-productive. In the commercial software sector, recent advances in tools and methodology have led to a renaissance for systematic fine-grained testing. We discuss the availability of analogous tools for scientific software and examine benefits that similar testing methodologies could bring to climate modeling software. We describe the unique challenges faced when testing complex numerical algorithms and suggest techniques to minimize and/or eliminate the difficulties.
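    The fine-grained testing advocated above can be sketched for a single numerical kernel: rather than running a full climate simulation and inspecting its output, a unit test checks one routine against a known analytic answer and its expected convergence behavior. The kernel (a centered finite difference) and the tolerances are illustrative assumptions, not code from any particular climate model.

```python
# Sketch of a fine-grained unit test for a numerical kernel: verify a
# centered finite-difference derivative against analytic results and check
# its second-order convergence, with no full-model run required.

import math

def centered_diff(f, x, h=1e-5):
    """Second-order centered finite-difference approximation of f'(x)."""
    return (f(x + h) - f(x - h)) / (2 * h)

def test_derivative_of_sin():
    # d/dx sin(x) = cos(x); the kernel should match to high accuracy
    for x in (0.0, 0.5, 1.3):
        assert abs(centered_diff(math.sin, x) - math.cos(x)) < 1e-8

def test_second_order_convergence():
    # Halving h should cut the truncation error by roughly a factor of four
    x = 1.0
    e1 = abs(centered_diff(math.exp, x, h=1e-3) - math.exp(x))
    e2 = abs(centered_diff(math.exp, x, h=5e-4) - math.exp(x))
    assert e2 < e1 / 2

test_derivative_of_sin()
test_second_order_convergence()
```

    Tests at this granularity isolate a defect to one routine and run in milliseconds, which is precisely the property that system-level regression tests on full simulations lack.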